
    Simulating Ability: Representing Skills in Games

    Throughout the history of games, representing the abilities of the various agents acting on behalf of the players has been a central concern. With increasingly sophisticated games emerging, these simulations have become more realistic, but the underlying mechanisms are still, to a large extent, ad hoc. This paper proposes using a logistic model from psychometrics as a unified mechanism for task resolution in simulation-oriented games.
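
    The abstract does not give the model's exact form, but a minimal sketch of a Rasch-style logistic resolution mechanism, with hypothetical skill and difficulty parameters, illustrates the idea: the probability of success depends only on the difference between an agent's ability and the task's difficulty.

        import math
        import random

        def success_probability(skill: float, difficulty: float) -> float:
            """Rasch-style logistic model: P(success) = 1 / (1 + exp(-(skill - difficulty)))."""
            return 1.0 / (1.0 + math.exp(-(skill - difficulty)))

        def resolve_task(skill: float, difficulty: float, rng: random.Random) -> bool:
            """Resolve one task attempt by sampling against the logistic probability."""
            return rng.random() < success_probability(skill, difficulty)

        rng = random.Random(42)
        print(success_probability(1.5, 0.5))  # ~0.73 for a skilled agent on an easy task
        print(resolve_task(1.5, 0.5, rng))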

    Set Theory and its Place in the Foundations of Mathematics: a new look at an old question

    This paper reviews the claims of several mainstream candidates for the foundations of mathematics, including set theory. The review concludes that, at the present level of mathematical knowledge, it would be very unreasonable to settle on any one of these foundations, and that the only reasonable choice is a pluralist one.

    Phase transition in the Jarzynski estimator of free energy differences

    The transition between a regime in which thermodynamic relations apply only to ensembles of small systems coupled to a large environment and a regime in which they can be used to characterize individual macroscopic systems is analyzed in terms of the change in behavior of the Jarzynski estimator of equilibrium free energy differences from nonequilibrium work measurements. Given a fixed number of measurements, the Jarzynski estimator is unbiased for sufficiently small systems. In these systems, the directionality of time is poorly defined and configurations that dominate the empirical average, but which are in fact typical of the reverse process, are sufficiently well sampled. As the system size increases, the arrow of time becomes better defined. The dominant atypical fluctuations become rare and eventually cannot be sampled with the limited resources that are available. Asymptotically, only typical work values are measured. The Jarzynski estimator becomes maximally biased and approaches the exponential of minus the average work, which is the result expected from standard macroscopic thermodynamics. In the proper scaling limit, this regime change can be described in terms of a phase transition in variants of the random energy model (REM). This correspondence is explicitly demonstrated in several examples of physical interest: near-equilibrium processes in which the work distribution is Gaussian, the sudden compression of an ideal gas, and adiabatic quasi-static volume changes in a dilute real gas.
    Comment: 29 pages, 5 figures; accepted for publication in Physical Review E (2012).
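
    As a concrete illustration of the estimator and its bias (a sketch with assumed parameter values, not taken from the paper): for the Gaussian work case mentioned above, the Jarzynski equality exp(-beta*dF) = <exp(-beta*W)> gives the exact result dF = mu - beta*sigma^2/2, so a small Monte Carlo experiment can show how, at a fixed number of measurements, the estimate drifts from the exact value toward the mean work as the work fluctuations grow.

        import numpy as np

        rng = np.random.default_rng(0)
        beta = 1.0         # inverse temperature 1/kT, in units where kT = 1
        mu, n = 5.0, 1000  # mean work and fixed number of measurements
        for sigma in (0.5, 2.0, 8.0):  # larger fluctuations mimic larger system size
            W = rng.normal(mu, sigma, size=n)
            # Jarzynski estimator: dF_hat = -(1/beta) * ln( (1/n) * sum_i exp(-beta*W_i) )
            dF_hat = -np.log(np.mean(np.exp(-beta * W))) / beta
            dF_exact = mu - beta * sigma**2 / 2  # exact result for Gaussian work
            print(f"sigma={sigma}: estimate={dF_hat:.2f}, "
                  f"exact={dF_exact:.2f}, mean work={W.mean():.2f}")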

    The Ehrenfest urn revisited: Playing the game on a realistic fluid model

    The Ehrenfest urn process, also known as the dogs-and-fleas model, is realistically simulated by molecular dynamics of the Lennard-Jones fluid. The key variable is Delta z, i.e., the absolute value of the difference between the number of particles in one half of the simulation box and in the other half. This is a pure-jump stochastic process induced, under coarse graining, by the deterministic time evolution of the atomic coordinates. We discuss the Markov hypothesis by analyzing the statistical properties of the jumps and of the waiting times between jumps. In the limit of a vanishing integration time-step, the distribution of waiting times becomes closer to an exponential and, therefore, the continuous-time jump stochastic process is Markovian. The random variable Delta z behaves as a Markov chain and, in the gas phase, the observed transition probabilities follow the predictions of the Ehrenfest theory.
    Comment: Accepted by Physical Review E on 4 May 200
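
    For reference (a textbook sketch of the idealized chain, not the paper's molecular-dynamics setup): in the Ehrenfest urn with n particles, a uniformly chosen particle switches halves at each step, so the left-half count z decreases with probability z/n and increases otherwise. A minimal simulation with assumed parameter values:

        import random

        def ehrenfest_step(z: int, n: int, rng: random.Random) -> int:
            """One urn move: a uniformly chosen particle switches halves.
            The chosen particle sits in the left half with probability z/n."""
            return z - 1 if rng.random() < z / n else z + 1

        rng = random.Random(1)
        n, z = 100, 100             # start with all particles in the left half
        for _ in range(1000):
            z = ehrenfest_step(z, n, rng)
        delta_z = abs(z - (n - z))  # the paper's key variable |N_left - N_right|
        print(z, delta_z)           # z fluctuates near n/2, so Delta z stays small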

    A new foundational crisis in mathematics, is it really happening?

    The article reconsiders the position of the foundations of mathematics after the discovery of Homotopy Type Theory (HoTT). The discussion that this discovery has generated among mathematicians, philosophers, and computer scientists might indicate a new crisis in the foundations of mathematics. By examining the mathematical facts behind HoTT and their relation to the existing foundations, we conclude that the present crisis is not one. We reiterate a pluralist vision of the foundations of mathematics. The article contains a short survey of the mathematical and historical background needed to understand the main foundational issues.
    Comment: Final version.

    Leibniz's Infinitesimals: Their Fictionality, Their Modern Implementations, And Their Foes From Berkeley To Russell And Beyond

    Many historians of the calculus deny significant continuity between the infinitesimal calculus of the 17th century and 20th-century developments such as Robinson's theory. Robinson's hyperreals, while providing a consistent theory of infinitesimals, require the resources of modern logic; thus many commentators are comfortable denying a historical continuity. A notable exception is Robinson himself, whose identification with the Leibnizian tradition inspired Lakatos, Laugwitz, and others to consider the history of the infinitesimal in a more favorable light. Despite his Leibnizian sympathies, Robinson regards Berkeley's criticisms of the infinitesimal calculus as aptly demonstrating the inconsistency of reasoning with historical infinitesimal magnitudes. We argue that Robinson, among others, overestimates the force of Berkeley's criticisms by underestimating the mathematical and philosophical resources available to Leibniz. Leibniz's infinitesimals are fictions: not logical fictions, as Ishiguro proposed, but rather pure fictions, like imaginaries, which are not eliminable by some syncategorematic paraphrase. We argue that Leibniz's defense of infinitesimals is more firmly grounded than Berkeley's criticism thereof. We show, moreover, that Leibniz's system for differential calculus was free of logical fallacies. Our argument strengthens the conception of modern infinitesimals as a development of Leibniz's strategy of relating inassignable to assignable quantities by means of his transcendental law of homogeneity.
    Comment: 69 pages, 3 figures.

    Henri Poincaré: The Status of Mechanical Explanations and the Foundations of Statistical Mechanics

    The first goal of this paper is to show the evolution of Poincaré's opinion on the mechanistic reduction of the principles of thermodynamics, placing it in the context of the science of his time. The second is to present some of his work in 1890 on the foundations of statistical mechanics. He became interested first in thermodynamics and its relation with mechanics, drawing on the work of Helmholtz on monocyclic systems. After a period of skepticism concerning the kinetic theory, he read some of Maxwell's memoirs and contributed to the foundations of statistical mechanics. I also show that Poincaré's contributions to the foundations of statistical mechanics are closely linked to his work in celestial mechanics and his interest in probability theory and its role in physics.

    Formalization of the classification pattern: Survey of classification modeling in information systems engineering

    Formalization is becoming more common in all stages of the development of information systems, as a better understanding of its benefits emerges. Classification systems are ubiquitous, nowhere more so than in domain modeling. The classification pattern that underlies these systems provides a good case study of the move towards formalization, in part because it illustrates some of the barriers to formalization, including the formal complexity of the pattern and the ontological issues surrounding the 'one and the many'. Powersets are a way of characterizing the (complex) formal structure of the classification pattern, and their formalization has been extensively studied in mathematics since Cantor's work in the late 19th century. One can use this formalization to develop a useful benchmark.

    There are various communities within Information Systems Engineering (ISE) that are gradually working towards a formalization of the classification pattern. However, for most of these communities this work is incomplete, in that they have not yet arrived at a solution with the expressiveness of the powerset benchmark. This contrasts with the early, smooth adoption of powersets by other Information Systems communities, for example to formalize relations. One way of understanding the varying rates of adoption is recognizing that the different communities carry different historical baggage. Many conceptual modeling communities emerged from work done on database design, and this creates hurdles to adopting the high level of expressiveness of powersets. Another relevant factor is that these communities also often feel, particularly in the case of domain modeling, a responsibility to explain the semantics of whatever formal structures they adopt.

    This paper aims to make sense of the formalization of the classification pattern in ISE and surveys its history through the literature, starting from the relevant theoretical works of the mathematical literature and gradually shifting focus to the ISE literature. The literature survey follows the evolution of ISE's understanding of how to formalize the classification pattern. The various proposals are assessed against the classical example of classification, the Linnaean taxonomy, formalized using powersets as a benchmark for formal expressiveness. The broad conclusion of the survey is that (1) the ISE community is currently in the early stages of understanding how to formalize the classification pattern, particularly with respect to the expressiveness exemplified by powersets, and (2) there is an opportunity to intervene and speed up the process of adoption by clarifying this expressiveness. Given the central place that the classification pattern has in domain modeling, this intervention has the potential to lead to significant improvements.
    Funding: UK Engineering and Physical Sciences Research Council (grant EP/K009923/1).
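
    To make the powerset benchmark concrete (a toy illustration under assumed names, not the paper's own formalization): if individual organisms are the instances, each Linnaean taxon can be modeled as an element of the powerset of the instance set, and subclassing becomes set inclusion.

        from itertools import chain, combinations

        def powerset(s):
            """All subsets of s: the candidate extensions of every possible class."""
            items = list(s)
            return [frozenset(c)
                    for c in chain.from_iterable(combinations(items, r)
                                                 for r in range(len(items) + 1))]

        # Hypothetical identifiers standing in for individual organisms.
        organisms = {"leo_1", "leo_2", "tigris_1", "lupus_1"}

        # Each taxon is an element of the powerset of the instances.
        panthera_leo = frozenset({"leo_1", "leo_2"})
        panthera = panthera_leo | {"tigris_1"}
        carnivora = panthera | {"lupus_1"}

        assert panthera_leo in powerset(organisms)    # a class is a powerset element
        assert panthera_leo <= panthera <= carnivora  # subclassing is set inclusion
        print(len(powerset(organisms)))               # 16 = 2**4 candidate classes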